
Add first-class streaming support for real-time token delivery#21

Closed
Copilot wants to merge 4 commits into main from
copilot/fix-843875a4-923d-41d3-989f-ac54c1bbdc88

Conversation


Copilot AI commented Sep 30, 2025

Streaming Support Implementation - Rebased on main

Successfully resolved all merge conflicts and rebased the streaming implementation on the latest main branch (0b9a76b).

Changes Made to Resolve Conflicts

Updated for main branch changes:

  • Changed std::sync::Mutex to tokio::sync::Mutex (async mutex) throughout
  • Updated send_message() signature from Vec<Message> to &[Message] (slice)
  • Updated get_last_usage() from sync to async
  • Updated send_message_stream() to use &[Message] instead of Vec<Message>
  • Adapted message handling in implementations to use slices instead of owned vectors

Test Organization:

  • Removed embedded tests from openai.rs (moved to external test files per main branch pattern)
  • Added streaming test to tests/client_tests.rs following the external test pattern
  • Updated streaming examples to use slice parameter

Files Modified

  • src/cloudllm/client_wrapper.rs - Updated trait with streaming support + async changes
  • src/cloudllm/clients/common.rs - Added streaming helper + async mutex
  • src/cloudllm/clients/openai.rs - Streaming implementation + slice parameters
  • src/cloudllm/clients/grok.rs - Streaming delegation + async changes
  • src/cloudllm/clients/claude.rs - Streaming delegation + async changes
  • src/cloudllm/llm_session.rs - Session streaming method + slice handling
  • tests/client_tests.rs - Added streaming test
  • examples/streaming_example.rs - Updated to use slices
  • README.md - Streaming feature documentation
  • STREAMING.md - Comprehensive streaming guide
  • Cargo.toml - Added streaming dependencies

Build Status

✅ All files compile successfully
✅ Examples build without errors
✅ Tests compile correctly
⚠️ 2 warnings about unused fields (pre-existing in main)

Conflicts Resolved

  • client_wrapper.rs: Merged streaming types with async mutex changes
  • common.rs: Combined streaming helpers with async updates
  • openai.rs: Integrated streaming with slice parameters and test relocation
  • grok.rs: Merged streaming delegation with async changes
  • claude.rs: Merged streaming delegation with async changes
  • llm_session.rs: Combined streaming method with main's improvements
Original prompt

This section details the original issue you should resolve.

<issue_title>Support first-class streaming support</issue_title>
<issue_description>* Files: src/cloudllm/clients/common.rs, openai.rs, others

  • Target: Client response handling

Issue

All clients currently wait for full completions. For providers supporting streaming (OpenAI, Grok, etc.), you can deliver tokens as soon as they arrive.

Suggestion

Add an optional streaming interface (e.g., returning a Stream<Item = MessageChunk>) while keeping the current "full response" path for compatibility.

Benefit

Dramatically reduces perceived latency in UIs; users see the assistant typing almost immediately.</issue_description>

Comments on the Issue (you are @copilot in this section)

Fixes #17


Copilot AI and others added 3 commits September 30, 2025 17:14
Co-authored-by: gubatron <163977+gubatron@users.noreply.github.com>
Co-authored-by: gubatron <163977+gubatron@users.noreply.github.com>
Co-authored-by: gubatron <163977+gubatron@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Support first-class streaming support" to "Add first-class streaming support for real-time token delivery" Sep 30, 2025
Copilot AI requested a review from gubatron September 30, 2025 17:22
@gubatron

gubatron commented Oct 9, 2025

@copilot resolve conflicts


Copilot AI commented Oct 9, 2025

> @copilot resolve conflicts

Conflicts resolved! Successfully rebased the streaming implementation on the latest main branch (commit eb2d678).

All merge conflicts have been resolved by:

  • Updating to async mutex (tokio::sync::Mutex)
  • Changing signatures to use &[Message] slices
  • Making get_last_usage() async
  • Moving tests to external test files
  • Adapting all streaming methods to new signatures

The build passes successfully with all features intact.

@gubatron gubatron closed this Oct 10, 2025
@gubatron gubatron deleted the copilot/fix-843875a4-923d-41d3-989f-ac54c1bbdc88 branch October 11, 2025 20:14

Development

Successfully merging this pull request may close these issues.

Support first-class streaming support
